
    Minimum entropy restoration using FPGAs and high-level techniques

    One of the greatest perceived barriers to the widespread use of FPGAs in image processing is the difficulty that application specialists face in developing algorithms on reconfigurable hardware. Minimum entropy deconvolution (MED) techniques have been shown to be effective in the restoration of star-field images. This paper reports on an attempt to implement a MED algorithm using simulated annealing, first on a microprocessor, then on an FPGA. The FPGA implementation uses DIME-C, a C-to-gates compiler, coupled with a low-level core library to simplify the design task. Analysis of the C code and of the output from the DIME-C compiler guided the code optimisation. The paper reports on the design effort that this entailed and the resultant performance improvements.
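
    The abstract does not give the algorithm itself; the sketch below is a minimal illustration of minimum entropy restoration by simulated annealing, written in Python rather than DIME-C. The entropy measure, the pixel-perturbation proposal, and the geometric cooling schedule are all assumptions for illustration, not the paper's implementation.

        import numpy as np

        def entropy(img):
            # Shannon entropy of the normalised pixel intensities; the paper's
            # exact entropy measure is not stated, so this one is an assumption.
            p = img.ravel() / img.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def restore_med(observed, steps=10_000, t0=1.0, cooling=0.999, rng=None):
            # Anneal a copy of the observed (non-negative) image towards minimum
            # entropy, accepting uphill moves with Boltzmann probability.
            rng = np.random.default_rng() if rng is None else rng
            estimate = observed.astype(float)
            energy = entropy(estimate)
            t = t0
            for _ in range(steps):
                y = rng.integers(0, estimate.shape[0])
                x = rng.integers(0, estimate.shape[1])
                candidate = estimate.copy()
                candidate[y, x] = max(0.0, candidate[y, x] + rng.normal(0.0, 1.0))
                e_new = entropy(candidate)
                if e_new < energy or rng.random() < np.exp((energy - e_new) / t):
                    estimate, energy = candidate, e_new
                t *= cooling  # geometric cooling schedule
            return estimate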

    Restoration of star-field images using high-level languages and core libraries

    Research into the use of FPGAs in image processing began in earnest at the beginning of the 1990s. Since then, many thousands of publications have pointed to the computational capabilities of FPGAs. During this time, FPGAs have seen their application space grow in tandem with their logic densities. When investigating a particular application, researchers compare FPGAs with alternative technologies such as Digital Signal Processors (DSPs), Application-Specific Integrated Circuits (ASICs), microprocessors and vector processors. The metrics for comparison depend on the needs of the application, and include such measurements as: raw performance, power consumption, unit cost, board footprint, non-recurring engineering cost, design time and design cost. The key metrics for a particular application may also include ratios of these metrics, e.g. power/performance, or performance/unit cost. The work detailed in this paper compares a 90nm-process commodity microprocessor with a platform based around a 90nm-process FPGA, focussing on design time and raw performance. The application chosen for implementation was a minimum entropy restoration of star-field images (see [1] for an introduction), with simulated annealing used to converge towards the globally-optimum solution. This application was not chosen in the belief that it would particularly suit one technology over another, but was instead selected as being representative of a computationally intense image-processing application.
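
    To make the ratio metrics concrete, the snippet below computes performance/power and performance/unit-cost for two platforms. All figures are invented for illustration; they are not the paper's measurements.

        # Hypothetical figures, for illustration only.
        platforms = {
            "microprocessor": {"throughput_fps": 12.0, "power_w": 90.0, "unit_cost_usd": 300.0},
            "fpga_platform":  {"throughput_fps": 48.0, "power_w": 25.0, "unit_cost_usd": 1200.0},
        }

        for name, m in platforms.items():
            perf_per_watt = m["throughput_fps"] / m["power_w"]
            perf_per_dollar = m["throughput_fps"] / m["unit_cost_usd"]
            print(f"{name}: {perf_per_watt:.2f} fps/W, {perf_per_dollar:.3f} fps/$")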

    GOES-I/M ascent maneuvers from transfer orbit to station

    The Geostationary Operational Environmental Satellite (GOES)-I/M station acquisition sequence consists nominally of three in-plane/out-of-plane maneuvers at apogee on the line of relative nodes and a small in-plane maneuver at perigee. Existing software to determine maneuver attitude, ignition time, and burn duration required modification to optimize the out-of-plane parts and admit the noninertial, three-axis stabilized attitude. The Modified Multiple Impulse Station Acquisition Maneuver Planning Program (SCENARIO2) was developed from its predecessor, SCENARIO, to optimize the out-of-plane components of the impulsive delta-V vectors. Additional new features include computation of short-term J2 perturbations and output of all premaneuver and postmaneuver orbit elements, coarse maneuver attitudes, propellant usage, spacecraft antenna aspect angles, and ground station coverage. The output data are intended to be used in the launch window computation and by the maneuver targeting computation (General Maneuver (GMAN) Program) software. The maneuver targeting computation in GMAN was modified to admit the GOES-I/M maneuver attitude. Appropriate combinations of ignition time, burn duration, and attitude enable any reasonable target orbit to be achieved.
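
    As a toy illustration of combining in-plane and out-of-plane components into a single impulsive burn, the sketch below computes the magnitude and yaw angle of the resulting delta-V vector. This is elementary vector addition for illustration only; SCENARIO2's actual optimisation and targeting are far more involved.

        import math

        def combined_burn(dv_in_plane, dv_out_of_plane):
            # Magnitude and out-of-plane (yaw) angle of one impulsive burn
            # combining in-plane and out-of-plane delta-V components (m/s).
            magnitude = math.hypot(dv_in_plane, dv_out_of_plane)
            yaw_deg = math.degrees(math.atan2(dv_out_of_plane, dv_in_plane))
            return magnitude, yaw_deg

        # Example: a 240 m/s in-plane burn combined with a 60 m/s plane change.
        mag, yaw = combined_burn(240.0, 60.0)
        print(f"delta-V = {mag:.1f} m/s at {yaw:.1f} deg out of plane")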

    Accelerator measurement of the energy spectra of neutrons emitted in the interaction of 3-GeV protons with several elements

    The application of time-of-flight techniques for determining the shapes of the energy spectra of neutrons between 20 and 400 MeV is discussed. The neutrons are emitted at 20, 34, and 90 degrees in the bombardment of targets by 3 GeV protons. The targets, cylindrical in cross section, are carbon, aluminum, cobalt, and platinum, and are located in the internal circulating beam of a particle accelerator during bombardment.
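
    The time-of-flight method recovers a neutron's kinetic energy from its flight time over a known baseline. The sketch below shows the relativistic conversion; the 10 m path length and 50 ns timing in the example are hypothetical, not the experiment's parameters.

        import math

        C = 299_792_458.0    # speed of light, m/s
        M_N_MEV = 939.565    # neutron rest mass, MeV/c^2

        def neutron_energy_mev(flight_path_m, flight_time_s):
            # Relativistic kinetic energy E = m*c^2*(gamma - 1) from time of flight.
            beta = flight_path_m / (flight_time_s * C)
            if beta >= 1.0:
                raise ValueError("implied speed exceeds c; check inputs")
            gamma = 1.0 / math.sqrt(1.0 - beta * beta)
            return M_N_MEV * (gamma - 1.0)

        # A neutron covering a 10 m baseline in 50 ns carries roughly 320 MeV.
        print(f"{neutron_energy_mev(10.0, 50e-9):.0f} MeV")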

    Integrating visual and tactile information in the perirhinal cortex

    By virtue of its widespread afferent projections, perirhinal cortex is thought to bind polymodal information into abstract object-level representations. Consistent with this proposal, deficits in cross-modal integration have been reported after perirhinal lesions in nonhuman primates. It is therefore surprising that imaging studies of humans have not observed perirhinal activation during visual–tactile object matching. Critically, however, these studies did not differentiate between congruent and incongruent trials. This is important because successful integration can only occur when polymodal information indicates a single object (congruent) rather than different objects (incongruent). We scanned neurologically intact individuals using functional magnetic resonance imaging (fMRI) while they matched shapes. We found higher perirhinal activation bilaterally for cross-modal (visual–tactile) than unimodal (visual–visual or tactile–tactile) matching, but only when visual and tactile attributes were congruent. Our results demonstrate that the human perirhinal cortex is involved in cross-modal, visual–tactile, integration and, thus, indicate a functional homology between human and monkey perirhinal cortices.

    Data augmentation and semi-supervised learning for deep neural networks-based text classifier

    User feedback is essential for understanding user needs. In this paper, we use free text obtained from a survey on sleep-related issues to build a deep neural network-based text classifier. However, training the deep neural network model requires a large amount of labelled data. To reduce manual data labelling, we propose a method combining data augmentation and pseudo-labelling: data augmentation is applied to labelled data to increase the size of the initial training set, and the trained model is then used to annotate unlabelled data with pseudo-labels. The results show that the model with data augmentation achieves a macro-averaged F1 score of 65.2% using 4,300 training samples, whereas the model without data augmentation achieves a macro-averaged F1 score of 68.2% with around 14,000 training samples. Furthermore, with the addition of pseudo-labelling, the model achieves a macro-averaged F1 score of 62.7% using only 1,400 labelled training samples. In other words, the proposed method reduces the amount of labelled data needed for training while achieving relatively good performance.
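
    A minimal sketch of the pseudo-labelling half of the pipeline, assuming a scikit-learn-style classifier exposing fit/predict_proba and a confidence threshold for adopting predictions as labels. The paper's classifier is a deep neural network; the names and the threshold here are illustrative assumptions.

        import numpy as np

        def pseudo_label(model, x_labeled, y_labeled, x_unlabeled, threshold=0.9, rounds=3):
            # Iteratively grow the training set with the model's own confident
            # predictions on unlabelled data (pseudo-labels).
            x_train, y_train = x_labeled, y_labeled
            for _ in range(rounds):
                model.fit(x_train, y_train)
                proba = model.predict_proba(x_unlabeled)
                confident = proba.max(axis=1) >= threshold
                if not confident.any():
                    break
                x_train = np.vstack([x_train, x_unlabeled[confident]])
                y_train = np.concatenate([y_train, proba[confident].argmax(axis=1)])
                x_unlabeled = x_unlabeled[~confident]
            return model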

    Socioeconomic deprivation and age are barriers to the online collection of patient reported outcome measures in orthopaedic patients

    Introduction: Questionnaires are commonly used to assess functional outcome and satisfaction in surgical patients. Although these have in the past been administered through written forms, there is increasing interest in the use of new technology to improve the efficiency of collection. The aim of this study was to assess the availability of internet access for a group of orthopaedic patients and the acceptability of online survey completion. Methods: A total of 497 patients attending orthopaedic outpatient clinics were surveyed to assess access to the internet and their preferred means of completing follow-up questionnaires. Results: Overall, 358 patients (72%) reported having internet access. Lack of access was associated with socioeconomic deprivation and older age. Multivariable regression confirmed increased age and greater deprivation to be independently associated with lack of internet access. Of the total group, 198 (40%) indicated a preference for assessment of outcomes via email and the internet. Conclusions: Internet access was not universal among the patients in our orthopaedic clinic. Reliance on internet collection of patient reported outcome measures (PROMs) may introduce bias by excluding results from patients in older age groups and those from more deprived socioeconomic groups.
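
    A minimal sketch of the kind of multivariable model the abstract describes: a logistic regression of internet access on age and deprivation. The data layout and values below are invented for illustration, not the study's data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical rows: [age in years, deprivation quintile (1 = least deprived)],
        # with a 0/1 indicator of internet access as the outcome.
        X = np.array([[45, 1], [72, 4], [58, 2], [80, 5], [33, 1], [67, 3]])
        y = np.array([1, 0, 1, 0, 1, 0])

        model = LogisticRegression().fit(X, y)
        # Negative coefficients would indicate that increasing age or deprivation
        # lowers the odds of internet access, mirroring the study's finding.
        print(dict(zip(["age", "deprivation"], model.coef_[0])))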

    Stress reduction in the hospital room: applying Ulrich's theory of supportive design

    Hospital rooms may exacerbate or reduce patients' stress. According to Ulrich's (1991) theory of supportive design, the hospital environment will reduce stress if it fosters perceptions of control (PC), social support (SS), and positive distraction (PD). An experimental study was conducted to test this theory. Participants were asked to imagine a hospitalization scenario and were exposed to one of eight lists of elements that the hospital room would provide, selected to facilitate PC, SS, PD, or one of the possible combinations of these elements. Results confirmed Ulrich's theory. Participants expected significantly less stress in the situations where all (or only the PD and SS) elements were present. Mediational analyses confirmed that the number of elements in the hospital room affects expected stress through the perceptions of how much positive distraction and social support the room is perceived to provide, but not through the perception of the level of control available.
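
    A minimal sketch of a regression-based (Baron and Kenny style) mediation check of the kind the abstract alludes to. All variable names are hypothetical: x is the number of supportive elements, mediator a perceived-distraction or perceived-support score, outcome the expected stress rating.

        import numpy as np
        import statsmodels.api as sm

        def mediation_paths(x, mediator, outcome):
            # Path a: elements -> mediator; path c: elements -> stress (total effect);
            # paths b and c': mediator and elements jointly -> stress.
            a = sm.OLS(mediator, sm.add_constant(x)).fit()
            c = sm.OLS(outcome, sm.add_constant(x)).fit()
            bc = sm.OLS(outcome, sm.add_constant(np.column_stack([x, mediator]))).fit()
            # Mediation is suggested when the direct effect of x shrinks once the
            # mediator is controlled for (compare c.params with bc.params).
            return a.params, c.params, bc.params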

    Integrated demand forecasting to support urban planning of low-carbon precincts: The waste scenario

    Waste is a symbol of inefficiency in modern society and represents misallocated resources. This paper outlines an ongoing interdisciplinary research project entitled 'Integrated ETWW demand forecasting and scenario planning for low-carbon precincts' and reports on first findings and a literature review. This large multi-stakeholder research project has been designed to develop a shared platform for integrated ETWW (energy, transport, waste, and water) planning in a low-carbon urban future, focusing on synergies and alternative approaches to urban planning. The aim of the project is to develop a holistic integrated software tool for demand forecasting and scenario evaluation for residential precincts covering the four domains (ETWW), using identified commonalities in data requirements and model formulation. The authors of this paper are overseeing the waste domain, while other researchers in the team have expertise in the remaining domains. A major component of the project will be developing a method for including the impacts of household behaviour change in demand forecasting. In this way the overall carbon impacts of urban developments or redevelopments of existing precincts can be assessed effectively and efficiently. The resulting tool will allow urban planners, municipalities, and developers to assess the future total demands for energy, transport, waste, and water while in the planning phase. The tool will also help to assess waste management performance and materials flow in relation to energy and water consumption and travel behaviour, supporting the design and management of urban systems in different city contexts
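
    As a toy illustration of precinct-level demand forecasting with a behaviour-change adjustment, the sketch below projects annual household waste. The parameters and the single multiplicative behaviour factor are illustrative assumptions, not the ETWW tool's model.

        def forecast_waste(households, kg_per_household_week, behaviour_factor=1.0,
                           years=10, growth=0.01):
            # Project annual waste (kg) for a precinct, scaling demand by a
            # behaviour-change factor and simple household growth.
            demand = []
            h = households
            for year in range(years):
                annual_kg = h * kg_per_household_week * 52 * behaviour_factor
                demand.append((year, round(annual_kg)))
                h *= 1 + growth
            return demand

        # 1,000 households at 20 kg/week, with a 10% reduction from behaviour change:
        print(forecast_waste(1000, 20.0, behaviour_factor=0.9)[:3])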